Sparsity through evolutionary pruning prevents neuronal networks from overfitting

Authors
Abstract


Similar articles

An Approach to Reducing Overfitting in FCM with Evolutionary Optimization

Fuzzy clustering methods are conveniently employed in constructing a fuzzy model of a system, but they need to tune some parameters. In this research, FCM is chosen for fuzzy clustering. Parameters such as the number of clusters and the value of fuzzifier significantly influence the extent of generalization of the fuzzy model. These two parameters require tuning to reduce the overfitting in the...
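
To make the two parameters concrete, here is a minimal NumPy sketch (illustrative only, not code from the cited paper): it fits a basic fuzzy c-means and picks the cluster count and fuzzifier by evaluating the clustering objective on held-out data. Both the grid and the held-out criterion are stand-ins for the evolutionary search and fitness used in the paper.

```python
import numpy as np

def fcm(X, c, m, iters=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means; returns the cluster centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)               # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m                                   # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = d ** (-2.0 / (m - 1.0))
        U_new /= U_new.sum(axis=1, keepdims=True)
        shift = np.abs(U_new - U).max()
        U = U_new
        if shift < tol:
            break
    return centers

def heldout_objective(X_val, centers, m):
    """FCM objective sum_ij u_ij^m * d_ij^2, evaluated on data not used for fitting."""
    d = np.linalg.norm(X_val[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    U = d ** (-2.0 / (m - 1.0))
    U /= U.sum(axis=1, keepdims=True)
    return float(((U ** m) * d ** 2).sum())

# Two Gaussian blobs; even/odd split into fitting and validation halves.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
X_fit, X_val = X[::2], X[1::2]
best = min(((c, m) for c in range(2, 6) for m in (1.5, 2.0, 2.5)),
           key=lambda cm: heldout_objective(X_val, fcm(X_fit, *cm), cm[1]))
print("selected (clusters, fuzzifier):", best)
```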


Pre-pruning Classification Trees to Reduce Overfitting in Noisy Domains

The automatic induction of classification rules from examples in the form of a classification tree is an important technique used in data mining. One of the problems encountered is the overfitting of rules to training data. In some cases this can lead to an excessively large number of rules, many of which have very little predictive value for unseen data. This paper describes a means of reducin...
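
As a rough illustration of pre-pruning, i.e. limiting growth while the tree is being induced rather than cutting it back afterwards, the scikit-learn sketch below caps depth and leaf size on noisy data. The particular stopping criteria are placeholders chosen for the example, not the ones proposed in the cited paper.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Noisy synthetic data: flip 20% of the labels to mimic a noisy domain.
X, y = make_classification(n_samples=2000, n_features=20, flip_y=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Unrestricted tree: grown until the leaves are pure, memorising the noise.
full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# Pre-pruned tree: growth stops early via limits applied *during* induction.
pre = DecisionTreeClassifier(max_depth=5, min_samples_leaf=20,
                             random_state=0).fit(X_tr, y_tr)

for name, clf in [("full", full), ("pre-pruned", pre)]:
    print(f"{name:10s} leaves={clf.get_n_leaves():4d} "
          f"train={clf.score(X_tr, y_tr):.2f} test={clf.score(X_te, y_te):.2f}")
```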


Using J-pruning to reduce overfitting in classification trees

The automatic induction of classification rules from examples in the form of a decision tree is an important technique used in data mining. One of the problems encountered is the overfitting of rules to training data. In some cases this can lead to an excessively large number of rules, many of which have very little predictive value for unseen data. This paper is concerned with the reduction of...
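
J-pruning bases its truncation decision on the J-measure of Smyth and Goodman. Assuming the standard definition of that measure, a minimal computation looks like the sketch below; the pruning policy built on top of it is not reproduced here.

```python
import math

def j_measure(p_antecedent, p_class, p_class_given_antecedent):
    """J-measure (in bits) of a rule 'IF antecedent THEN class'.

    p_antecedent            -- probability that the rule fires
    p_class                 -- prior probability of the predicted class
    p_class_given_antecedent -- probability of the class given the antecedent
    """
    def term(p, q):
        return 0.0 if p == 0.0 else p * math.log2(p / q)
    # Information gained about the class when the antecedent holds.
    j = (term(p_class_given_antecedent, p_class)
         + term(1.0 - p_class_given_antecedent, 1.0 - p_class))
    return p_antecedent * j

# A rule covering 10% of the data with 90% accuracy, against a 50/50 class prior.
print(f"J = {j_measure(0.10, 0.5, 0.9):.3f} bits")
```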


Efficient Synaptic Pruning with Neuronal Regulation

Neuronal regulation is a mechanism that was recently found to maintain the homeostasis of the neuron's membrane potential. We show that the operation of this mechanism may lead to a bi-modal distribution of synaptic efficacies, pruning the weak synapses and strengthening the rest. Deriving optimal synaptic modification functions, we identify conditions under which neuronal regulation leads to near...
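
A toy NumPy simulation of the qualitative effect described (not the model from the cited paper): multiplicative drift combined with a homeostatic rescaling of the neuron's total input splits the efficacies into a bimodal pattern, with weak synapses pruned and the survivors strengthened.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.uniform(0.0, 1.0, 1000)      # initial synaptic efficacies of one neuron
target_input = w.sum()               # homeostatic set point for total drive

for step in range(200):
    # Multiplicative drift: below-average synapses tend to decay, others to grow.
    w *= rng.normal(1.0, 0.05, w.size) * (0.98 + 0.04 * (w > w.mean()))
    w = np.clip(w, 0.0, None)
    # Neuronal regulation: rescale all efficacies so the total input stays constant.
    w *= target_input / (w.sum() + 1e-12)
    # Prune synapses that have drifted below a small threshold.
    w[w < 0.05] = 0.0

alive = w[w > 0]
print(f"pruned {np.mean(w == 0):.0%} of synapses; "
      f"surviving efficacies: mean={alive.mean():.2f}, min={alive.min():.2f}")
```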


Dropout: a simple way to prevent neural networks from overfitting

Deep neural nets with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious problem in such networks. Large networks are also slow to use, making it difficult to deal with overfitting by combining the predictions of many different large neural nets at test time. Dropout is a technique for addressing this problem. The key idea is to randomly d...
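
The idea the snippet cuts off is to randomly drop units, along with their connections, during training. A minimal inverted-dropout sketch in NumPy, shown for illustration rather than as the paper's implementation:

```python
import numpy as np

def dropout(activations, p_drop=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability p_drop
    and scale the survivors by 1/(1 - p_drop) so no rescaling is needed at test time."""
    if not training or p_drop == 0.0:
        return activations                        # inference: use the full network
    rng = np.random.default_rng(0) if rng is None else rng
    mask = rng.random(activations.shape) >= p_drop
    return activations * mask / (1.0 - p_drop)

h = np.ones((2, 8))                    # a batch of hidden activations
print(dropout(h))                      # roughly half the units zeroed, the rest scaled to 2.0
print(dropout(h, training=False))      # unchanged at inference
```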



Journal

Journal title: Neural Networks

Year: 2020

ISSN: 0893-6080

DOI: 10.1016/j.neunet.2020.05.007